IEEE Transactions on Neural Systems and Rehabilitation Engineering
● Institute of Electrical and Electronics Engineers (IEEE)
Preprints posted in the last 7 days, ranked by how well they match IEEE Transactions on Neural Systems and Rehabilitation Engineering's content profile, based on 40 papers previously published here. The average preprint has a 0.03% match score for this journal, so anything above that is already an above-average fit.
Hosseini-Yazdi, S.-S.; Fitzsimons, K.; Bertram, J. E.
Walking speed is widely used to assess gait recovery following stroke, yet it provides limited insight into how walking performance is mechanically organized. This study examined how center of mass (COM) work organization and propulsion-support coupling vary across walking speeds in individuals with post-stroke hemiparesis, to distinguish recovery of gait organization from recovery of limb-level mechanical function. Eleven individuals with post-stroke hemiparesis performed treadmill walking at speeds ranging from 0.2 to 0.7 m/s while ground reaction forces were recorded. Limb-specific COM power and work were computed using an individual-limbs framework, and interlimb asymmetry in net and positive work, along with the propulsion-support ratio (PSR), was quantified. A qualitative transition in gait organization was observed: at lower walking speeds, COM power exhibited a simplified two-phase pattern, whereas at higher walking speeds (approximately ≥0.5 m/s) a structured four-phase COM power pattern emerged, including identifiable push-off and preload phases. Despite this recovery of gait organization, interlimb work asymmetry remained elevated and paretic PSR remained reduced across all speeds, indicating persistent limb-level mechanical deficits. These findings demonstrate that increases in walking speed and the emergence of a typical COM power structure reflect recovery of gait organization rather than restoration of underlying limb-level mechanical capacity. Consequently, walking speed alone is insufficient to characterize gait recovery after stroke; biomechanically informed measures of COM work organization and propulsion-support coupling provide complementary insight by distinguishing organizational recovery from limb-level mechanical recovery.
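The individual-limbs approach described above has a simple numerical core: limb-specific COM power is the dot product of that limb's ground reaction force with COM velocity, and positive/negative work follow by integrating the power signal. The function names and toy arrays below are illustrative, not taken from the paper:

```python
import numpy as np

def limb_com_power(grf, v_com):
    """Instantaneous COM power for one limb: P(t) = F_limb(t) . v_COM(t)."""
    return np.einsum('ij,ij->i', np.asarray(grf, float), np.asarray(v_com, float))

def pos_neg_work(power, dt):
    """Positive and negative COM work via trapezoidal integration of P(t)."""
    p = np.asarray(power, float)
    trap = lambda q: float(dt * 0.5 * (q[:-1] + q[1:]).sum())
    return trap(np.clip(p, 0, None)), trap(np.clip(p, None, 0))

def work_asymmetry(w_paretic, w_nonparetic):
    """Interlimb asymmetry index for positive work (0 = perfectly symmetric)."""
    return abs(w_paretic - w_nonparetic) / (w_paretic + w_nonparetic)
```

With per-limb force plates, these quantities would be computed stride by stride and averaged per speed condition.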
Chafetz, R.; Warshauer, S.; Waldron, S.; Kruger, K. M.; Donahue, S.; Bauer, J. P.; Sienko, S.; Bagley, A.; Courter, R.
Markerless motion capture has emerged as a potential substitute for traditional marker-based systems, offering scalable, non-invasive acquisition of human movement. Despite increasing adoption in research and sports applications, its clinical utility for children with complex gait patterns remains an open question. To address this gap, simultaneous marker-based and markerless data were collected in 202 pediatric participants (12.1 ± 3.9 years). Marker-based kinematics were processed using the Shriners Children's Gait Model (SCGM), while markerless outputs were computed using Theia3D with identical Cardan sequences. Agreement between systems was evaluated using statistical parametric mapping (SPM), root-mean-square error (RMSE), and a gait pattern classification based on the plantarflexor-knee extension index. Markerless output systematically underestimated pelvic tilt, hip rotation, and knee rotation and demonstrated reduced between-subject variance in the transverse plane. SPM revealed widespread waveform differences, although most were of negligible effect, especially in the sagittal plane. Mean sagittal-plane RMSEs were < 5° for the knee and ankle and < 8° for the pelvis and hip. Coronal-plane deviations were < 7°, whereas transverse-plane errors exceeded 10°. RMSE increased significantly with body mass index and use of a walker (p < 0.001). Agreement in sagittal-plane gait classification was moderate between systems (κ = 0.60; 67% overall concordance). These results indicate that markerless motion capture is suitable for analyses emphasizing sagittal deviations but remains limited for applications requiring precise axial or frontal-plane estimation. Future work should address algorithmic underestimation of transverse motion and evaluate markerless performance across increasing severity of gait deviation.
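Two of the agreement metrics above are easy to make concrete: waveform RMSE and Cohen's kappa for classification concordance. A minimal sketch (variable names are illustrative; the study's full pipeline also used SPM):

```python
import numpy as np

def rmse(a, b):
    """Root-mean-square error between two joint-angle waveforms of equal length."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.sqrt(np.mean((a - b) ** 2)))

def cohens_kappa(x, y):
    """Cohen's kappa: chance-corrected agreement between two classifications."""
    x, y = np.asarray(x), np.asarray(y)
    po = float(np.mean(x == y))                       # observed agreement
    pe = sum(float(np.mean(x == c)) * float(np.mean(y == c))
             for c in np.union1d(x, y))               # expected chance agreement
    return (po - pe) / (1 - pe)
```

A kappa of 0.60, as reported, falls in the conventional "moderate" band precisely because chance agreement is subtracted out before scaling.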
Valestrino, K. J.; Ihediwa, C. V.; Dorius, G. T.; Conger, A. M.; Glinka-Przybysz, A.; McCormick, Z. L.; Fogarty, A. E.; Mahan, M. A.; Hernandez-Bello, J.; Konrad, P. E.; Burnham, T. R.; Dalrymple, A. N.
Objectives: Epidural spinal cord stimulation (SCS) is an emerging therapy for motor rehabilitation following spinal cord injury (SCI) and other motor disorders. Conventionally, SCS leads are placed along the dorsal spinal cord (SCSD), where stimulation activates large-diameter afferent fibers, which indirectly activate motoneurons through reflex pathways. This leads to broad activation of flexor and extensor muscles and limited fine-tuned control of motor output. Targeting the ventral spinal cord (SCSV) may enable more direct activation of motoneuron pools, potentially improving the specificity of muscle activation; however, there is currently no established method to place leads ventrally. To address this, we evaluated the feasibility of four modified percutaneous implantation techniques to target the ventrolateral thoracolumbar spinal cord. Materials and Methods: Percutaneous SCSV implantation was performed in three human cadaver torso specimens under fluoroscopic guidance. The following approaches were evaluated: sacral hiatus, transforaminal, interlaminar contralateral, and interlaminar ipsilateral. The leads in the latter three approaches were inserted between L1 and L5. Eighteen implants were attempted, with nine leads retained for analysis. Lead and electrode position were assessed using computed tomography (CT) with three-dimensional reconstruction, along with anatomical dissection to verify lead and electrode placement within the epidural space. Results: Successful ventral epidural lead placement was achieved using all four implantation approaches. The sacral hiatus (16/16 electrodes) and transforaminal (8/8 electrodes) approaches resulted in exclusively ventrolateral placement. The interlaminar contralateral approach led to 27/32 electrodes positioned ventrolaterally and 5/32 dorsally. The interlaminar ipsilateral approach led to 14/32 electrodes positioned ventrolaterally and 18/32 positioned ventromedially.
Conclusions: These findings demonstrate that ventral epidural SCS lead placement can be achieved using modified percutaneous implant techniques. The four approaches outlined here provide a clinically feasible pathway to SCSV and establish a foundation for future clinical studies investigating SCSV for motor rehabilitation following SCI.
Yamasaki, F.; Seike, M.; Hirota, T.; Sato, T.
Background: Deep brain stimulation (DBS) is a treatment option for Parkinson disease (PD). However, the effect of DBS on the arterial pressure (AP) remains unexplored. We aimed to develop an artificial baroreflex system for treating orthostatic hypotension (OH) due to central baroreflex failure in patients with PD. To achieve this, we developed an appropriate algorithm after estimating the dynamic responses of the AP to DBS using a white noise system identification method. Methods: We randomly performed DBS while measuring the AP tonometrically in 3 trials involving 3 patients with PD treated with DBS. We calculated the frequency response of the AP to the DBS using a fast Fourier transform algorithm. Finally, the feedback correction factors were determined via numerical simulation. Results: The frequency responses of the systolic AP to random DBS were identifiable in all 3 trials, and the steady state gain was 8.24 mmHg/STM. Based on these results, the proportional correction factor was set to 0.12, and the integral correction factor was set to 0.018. The computer simulation revealed that the system could quickly and effectively attenuate a sudden AP drop induced by external disturbances such as head-up tilting. Conclusion: An artificial baroreflex system with DBS may be a novel therapeutic approach for OH caused by central baroreflex failure.
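The feedback law implied by the reported correction factors is a discrete PI controller: the stimulation command is proportional plus integral action on the arterial-pressure error. The sketch below uses the paper's reported gains (0.12 proportional, 0.018 integral), but the function signature, units, and update interval are assumptions for illustration:

```python
def pi_stim_update(ap_target, ap_measured, integral, kp=0.12, ki=0.018, dt=1.0):
    """One step of a PI feedback law mapping AP error (mmHg) to a DBS command.

    The command would map back to arterial pressure through the plant's
    steady-state gain (reported as 8.24 mmHg per unit of stimulation)."""
    error = ap_target - ap_measured   # positive when pressure falls below target
    integral = integral + error * dt  # accumulate error for the integral term
    command = kp * error + ki * integral
    return command, integral
```

In a head-up-tilt disturbance, the proportional term reacts immediately to the pressure drop while the integral term removes the residual steady-state error.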
Varisco, G.; Plantin, J.; Almeida, R.; Palmcrantz, S.; Astrand, E.
Stroke is the third leading cause of death and disability combined worldwide and often results in hemiparesis. Functional magnetic resonance imaging (fMRI) is a non-invasive technique used to investigate changes in brain activations during tasks aimed at restoring lost motor function. Participants with chronic stroke and residual hemiparesis in the upper extremity were recruited for a clinical intervention that included neurofeedback training and fMRI sessions with motor-execution and motor-imagery tasks. The present study provides a baseline characterization of brain activations prior to neurofeedback training. Since lesion site and volume varied across participants, two fMRI preprocessing pipelines were applied. The first was used for twelve participants with lesions restricted to a single hemisphere and for one participant with small secondary lesions in the contralesional hemisphere, whereas the second was used for two participants with large bilateral lesions. These were followed by quality control measures and statistical analysis. First-level (i.e., single-participant) analysis returned the strongest and most extensive activation across participants during motor-execution tasks, with clusters identified in the ipsilesional parietal lobe, bilateral occipital lobes, and cerebellum after Family-Wise Error correction. Second-level (i.e., group-level) analysis involving participants who underwent the first fMRI preprocessing pipeline revealed a significant cluster in the cerebellum after False Discovery Rate correction. These results are consistent with previous studies involving participants with chronic stroke performing motor tasks. Cerebellar recruitment observed consistently across participants could reflect compensatory mechanisms supporting motor control after stroke.
Emerick, M.; Grahn, J. A.
Walking impairments in Parkinson's disease (PD), including reduced speed, cadence, and stride length, and increased variability, impair mobility and raise fall risk. Conventional treatments may fail to address these deficits, underscoring the need for complementary non-invasive alternatives. This study examined whether combining rhythmic auditory cueing with transcranial direct current stimulation (tDCS) over the supplementary motor area (SMA), a critical region for internally generated movement, would enhance gait performance in PD. Thirty-three participants with PD and thirty-two healthy controls completed two sessions (anodal vs. sham tDCS) with gait assessed during stimulation, immediately after stimulation, and 15 minutes after stimulation under two auditory conditions: walking in silence and walking to music paced 10% faster than baseline cadence. Spatiotemporal, variability, and stability gait parameters were analyzed using linear mixed-effects models. Rhythmic auditory cueing significantly increased cadence and speed during, immediately after, and especially 15 minutes after stimulation, suggesting sustained effects of rhythmic entrainment. Anodal tDCS produced faster cadence, as well as lower stride time variability and stride width, particularly in individuals with PD. Although both music and anodal tDCS affected gait, no interaction was observed, indicating independent effects. Individuals with PD had greater gait variability overall, and adjusted temporal gait parameters less to music than healthy controls did. Anodal stimulation reduced walking variability in PD, reducing the group differences observed under sham conditions. These findings suggest that rhythmic cueing and SMA stimulation target complementary mechanisms, highlighting the promise of combined tDCS-music interventions for gait rehabilitation in PD.
Ni, N.; Zhao, B.; Wang, Y.; Wang, Q.; Ding, J.; Liu, T.
The ISBAR framework is used to standardize clinical handovers and enhance patient safety. Observational tools based on ISBAR have been developed to assess the completeness of information transfer. However, these instruments have primarily been developed in non-Chinese contexts, and validated Chinese-language observational tools suitable for clinical practice remain limited. In this study, a cross-cultural adaptation and psychometric validation of the ISBAR Structured Handover Observation Tool was conducted, examining its reliability and discriminant validity in Chinese clinical settings. The study was conducted in two phases: cross-cultural adaptation and psychometric evaluation in real-world clinical settings. Content validity was assessed using the Content Validity Index (CVI), and inter-rater reliability was evaluated using the Intraclass Correlation Coefficient (ICC) based on a two-way mixed-effects model with absolute agreement. Discriminant validity was examined using the Mann-Whitney U test to compare scores across nurses with varying levels of clinical experience. A total of 233 handover cases involving patient transfers from the intensive care unit (ICU) to general wards were collected, involving 84 nurses. The scale demonstrated good content validity, with item-level CVIs ranging from 0.88 to 1.00 and a scale-level CVI/Ave of 0.98. Inter-rater reliability, assessed using fifty randomly selected cases, was high, with an ICC of 0.885 for single-rater assessments and 0.939 for average-rater assessments. Discriminant validity analysis showed that nurses with more clinical experience had significantly higher total scores than those with less experience (Z = -4.772, p < 0.001). The Chinese version of the ISBAR Structured Handover Observation Tool demonstrates good content validity, high inter-rater reliability, and acceptable discriminant validity. This tool provides a standardized and practical method for assessing the completeness of information transfer and is expected to support quality improvement in patient handovers from the ICU to general wards in Chinese clinical settings.
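The content-validity indices reported above have a simple definition: the item-level CVI is the proportion of experts rating an item 3 or 4 on a 4-point relevance scale, and S-CVI/Ave is the mean of the item-level values. A sketch (the rating matrix below is hypothetical):

```python
import numpy as np

def item_cvi(ratings, relevant=(3, 4)):
    """I-CVI: share of experts rating the item as relevant (3 or 4 on a 4-point scale)."""
    return float(np.mean(np.isin(np.asarray(ratings), relevant)))

def scale_cvi_ave(item_ratings):
    """S-CVI/Ave: average of the item-level CVIs across all items."""
    return float(np.mean([item_cvi(r) for r in item_ratings]))
```

With I-CVIs between 0.88 and 1.00, the reported S-CVI/Ave of 0.98 follows directly as their mean.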
da Silva Castanheira, J.; Landry, M.; Fleming, S. M.
Brain activity comprises both rhythmic (periodic) and arrhythmic (aperiodic) components. These signal elements vary across healthy aging and disease, and may make distinct contributions to conscious perception. Despite pioneering techniques to parameterize rhythmic and arrhythmic neural components based on power spectra, the methodology for quantifying rhythmic activity remains in its infancy. Previous work has relied on parametric estimates of rhythmic power extracted from specparam, or on estimates of rhythmic power obtained after detrending neural spectra. Variation in analytical choices for isolating brain rhythms from background arrhythmic activity makes interpreting findings across studies difficult. Whether these current approaches can accurately recover the independent contributions of these neural signal elements remains to be established. Here, using simulation and parameter recovery approaches, we show that power estimates obtained from detrended spectra conflate these two neurophysiological components, yielding spurious correlations between spectral model parameters. In contrast, modelled rhythmic power obtained from specparam, which detrends the power spectra and parametrizes brain rhythms, independently recovers the rhythmic and arrhythmic components in simulated neural time series, minimising spurious relationships. We validate these methods using resting-state recordings from a large cohort. Based on our findings, we recommend modelled rhythmic power estimates from specparam for the robust independent quantification of rhythmic and arrhythmic signal components for cognitive neuroscience.
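The parameterization idea behind specparam can be illustrated by jointly fitting an aperiodic 1/f component plus a Gaussian peak in log-power, so rhythmic and arrhythmic parameters are estimated together rather than after detrending. A toy sketch (not the specparam implementation; the model form and starting values are simplified assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

def spectral_model(f, offset, exponent, height, center, width):
    """log10 power = aperiodic background (offset - exponent * log10 f)
    plus one Gaussian oscillatory peak over that background."""
    aperiodic = offset - exponent * np.log10(f)
    peak = height * np.exp(-((f - center) ** 2) / (2 * width ** 2))
    return aperiodic + peak

# Noiseless synthetic spectrum with a 10-Hz (alpha-band) peak.
freqs = np.linspace(2.0, 40.0, 200)
true_params = (1.0, 1.2, 0.6, 10.0, 1.5)
log_power = spectral_model(freqs, *true_params)

# Joint fit recovers periodic and aperiodic parameters together.
fit, _ = curve_fit(spectral_model, freqs, log_power,
                   p0=(0.5, 1.0, 0.3, 9.0, 2.0))
```

Estimating peak power by first detrending with a misspecified aperiodic fit, by contrast, leaks exponent error into the rhythmic estimate, which is the conflation the study demonstrates.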
Liu, J.; Fan, J.; Deng, Z.; Tang, X.; Zhang, H.; Sharma, A.; Li, Q.; Liang, C.; Wang, A. Y.; Liu, L.; Luo, K.; Liu, H.; Qiu, H.
Background: Patient-ventilator synchrony, an essential prerequisite for non-invasive mechanical ventilation, requires accurate matching of every phase of respiration between the patient and the ventilator. Methods: We developed a long short-term memory (LSTM)-based model that predicts the patient's inspiratory and expiratory timing. The model consisted of two hidden layers, each with eight LSTM units, and was trained on approximately 27,000 flow signal segments of 500 ms capturing both inspiratory and expiratory events. Results: The LSTM model achieved 97% accuracy and F1 score on the test data, and the average trigger error was less than 2.20%. In the first trial, 10 volunteers were enrolled. In "Compliance" mode, 78.6% of triggers from the LSTM model were compatible with neural respiration, higher than the Auto-Trak model (74.2%). The Auto-Trak model performed marginally better at pressure support levels of 5 and 10 cmH2O. Following the success of the first clinical trial, we further tested the models in five patients with acute respiratory distress syndrome (ARDS). The LSTM model placed 60.6% of triggers in the 33%-box, better than the 49.0% of the Auto-Trak model, and its PVI index was significantly lower (36.5% vs 52.9%). Conclusions: Overall, the LSTM model performed comparably to, or better than, the Auto-Trak model in both latency and PVI index. While other mathematical models have been developed, our model was effectively embedded in a chip to control ventilator triggering. Trial registration: Approval Number: 2023ZDSYLL348-P01; Approval Date: 28/09/2023. Clinical Trial Registration Number: ChiCTR2500097446; Registration Date: 19/02/2025.
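The paper's network (two hidden layers of eight LSTM units over 500-ms flow windows) is not specified beyond that, but the recurrence it relies on is the standard LSTM cell. A minimal NumPy sketch of one step (gate ordering and shapes are conventional choices, not taken from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell update.

    x: input (D,); h, c: hidden and cell state (H,);
    W: (4H, D), U: (4H, H), b: (4H,), gates stacked as [i, f, g, o]."""
    H = h.size
    z = W @ x + U @ h + b
    i = sigmoid(z[:H])            # input gate
    f = sigmoid(z[H:2 * H])       # forget gate
    g = np.tanh(z[2 * H:3 * H])   # candidate cell update
    o = sigmoid(z[3 * H:])        # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

Stacking two such layers with H = 8 and feeding successive flow samples reproduces the described architecture; an inspiration/expiration classifier would sit on top of the final hidden state.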
Harikumar, A.; Baker, B.; Amen, D.; Keator, D.; Calhoun, V. D.
Single photon emission computed tomography (SPECT) is a highly specialized imaging modality that enables measurement of regional cerebral perfusion and, in particular, resting cerebral blood flow (rCBF). Recent technological advances have improved SPECT quantification and reliability, making it increasingly useful for studying rCBF abnormalities and perfusion network alterations in psychiatric and neurological disorders. To characterize large-scale functional organization in SPECT data, data-driven decomposition methods such as independent component analysis (ICA) have been used to extract covarying perfusion patterns that map onto interpretable brain networks. Blind ICA provides a data-driven approach to estimate these networks without strong prior assumptions. More recently, a hybrid approach that leverages spatial priors to guide a spatially constrained ICA (sc-ICA) has been used to fully automate the ICA analysis while also providing participant-specific network estimates. While this has been reliably demonstrated in fMRI with the NeuroMark template, there is currently no comparable SPECT template. A SPECT template would enable automatic estimation of functional SPECT networks with participant-specific expressions that correspond across participants and studies. The current study introduces a new replicable NeuroMark SPECT template for estimating canonical perfusion covariance patterns (networks). We first identify replicable SPECT networks using blind ICA applied to two large-sample SPECT datasets. We then demonstrate the use of the resulting template by applying sc-ICA to an independent schizophrenia dataset. In sum, this work presents and shares the first NeuroMark SPECT template and demonstrates its utility in an independent cohort, providing a scalable and robust framework for network-based analyses.
Xu, M.; Philips, R.; Singavarapu, A.; Zheng, M.; Martin, D.; Nikolin, S.; Mutz, J.; Becker, A.; Firenze, R.; Tsai, L.-H.
Background: Gamma oscillation dysfunction has been implicated in neuropsychiatric disorders. Restoring gamma oscillations via brain stimulation represents an emerging therapeutic approach. However, the strength of its clinical effects and its treatment moderators remain unclear. Method: We conducted a systematic review and meta-analysis to examine the clinical effects of gamma neuromodulation in neuropsychiatric disorders. A literature search for controlled trials using gamma stimulation was performed across five databases up until April 2025. Effect sizes were calculated using Hedges' g. Separate analyses using the random-effects model examined the clinical effects in schizophrenia (SZ), major depressive disorder (MDD), bipolar disorder (BD), and autism spectrum disorder (ASD). For SZ and MDD, subgroup analyses evaluated the effects of stimulation modality, stimulation frequency, treatment duration, and pulses per session. Result: Fifty-six studies met the inclusion criteria (SZ: N = 943; MDD: N = 916; BD: N = 175; ASD: N = 232). In SZ, gamma stimulation was associated with improvements in positive (k = 10, g = -0.60, p < 0.001), negative (k = 12, g = -0.37, p = 0.03), depressive (k = 8, g = -0.39, p < 0.001), and anxious symptoms (k = 5, g = -0.59, p < 0.001), and in overall cognitive function (k = 7, g = 0.55, p < 0.001). Stimulation frequency and treatment duration moderated therapeutic effects. In MDD, reductions in depressive symptoms were observed (k = 23, g = -0.34, p = 0.007). Conclusion: Gamma neuromodulation showed moderate therapeutic benefits in SZ and MDD. Substantial heterogeneity likely reflects protocol differences, highlighting the need for well-powered future trials.
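The effect-size and pooling machinery named above can be sketched: Hedges' g is Cohen's d scaled by the small-sample correction J, and random-effects pooling commonly uses the DerSimonian-Laird between-study variance. A sketch (the abstract does not state which tau-squared estimator was used; DerSimonian-Laird is assumed here for illustration):

```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g: standardized mean difference with small-sample correction J."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                  # Cohen's d with pooled SD
    j = 1 - 3 / (4 * (n1 + n2) - 9)     # small-sample correction factor
    return j * d

def dl_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate and tau-squared."""
    e, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1 / v
    fixed = np.sum(w * e) / np.sum(w)
    q = np.sum(w * (e - fixed) ** 2)                   # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(e) - 1)) / c)            # between-study variance
    w_star = 1 / (v + tau2)
    return float(np.sum(w_star * e) / np.sum(w_star)), tau2
```

With identical study effects, Q falls below its degrees of freedom and tau-squared is truncated at zero, reducing the pooled estimate to the fixed-effect value.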
Quide, Y.; Lim, T. E.; Gustin, S. M.
Background: Early-life adversity (ELA) is a risk factor for enduring pain in youth and is associated with alterations in brain morphology and function. However, it remains unclear whether ELA-related neurobiological changes contribute to the development of enduring pain in early adolescence. Methods: Using data from the Adolescent Brain Cognitive Development (ABCD) Study, we examined multimodal magnetic resonance imaging (MRI) markers in children assessed at baseline (ages 9-11 years) and at 2-year follow-up (ages 11-13 years). ELA exposure was defined at baseline to maximise temporal separation between early adversity and later enduring pain. Participants with enduring pain at follow-up (n = 322) were compared to matched pain-free controls (n = 644). Structural MRI, diffusion MRI (fractional anisotropy, mean diffusivity), and resting-state functional connectivity data were analysed. Linear models tested main effects of enduring pain, ELA, and their interaction on brain metrics, controlling for relevant covariates. Results: ELA exposure was associated with smaller caudate and nucleus accumbens volumes, and reduced surface area of the left rostral middle frontal gyrus. No significant effects of enduring pain or the ELA-by-enduring-pain interaction were observed across grey matter, white matter, or functional connectivity measures. Conclusions: ELA was associated with alterations in fronto-striatal regions in late childhood, but these changes were not linked to enduring pain in early adolescence. These findings suggest that ELA-related neurobiological alterations may represent early markers of vulnerability rather than concurrent correlates of enduring pain. Longitudinal follow-up is needed to determine whether these alterations contribute to later chronic pain risk.
Spann, D. J.; Hall, L. M.; Moussa-Tooks, A.; Sheffield, J. M.
Background: Negative symptoms are core features of schizophrenia that relate strongly to functional impairment, yet interventions targeting these symptoms remain largely ineffective. Emerging theoretical work highlights how environmental factors may shape and maintain negative symptoms. Although racial disparities in schizophrenia diagnosis among Black Americans are well documented and linked to racial stress and psychosis, the impact of racial stress on negative symptoms has not been examined. This study provides an initial test of a novel theory proposing that racial stress (here measured by racial discrimination) influences negative symptom severity through exacerbation of negative cognitions about the self, particularly defeatist performance beliefs (DPB). Study Design: Participants diagnosed with a schizophrenia-spectrum disorder (SSD) (N = 208; 80 Black, 128 White) completed the Positive and Negative Syndrome Scale (PANSS), the Defeatist Beliefs Scale, and self-report measures of subjective racial and ethnic discrimination (Racial and Ethnic Minority Scale and General Ethnic Discrimination Scale). Relationships among variables were tested using linear regression and mediation analysis. Study Results: Black participants exhibited significantly greater total and experiential negative symptoms than White participants, with no group difference in DPB. Racial discrimination explained 46% of the relationship between race and negative symptoms. Among Black participants, higher DPB were associated with greater negative symptom severity. Discrimination was positively related to both DPB and negative symptoms. DPB partially mediated the relationship between discrimination and negative symptoms. Conclusions: Findings suggest that racial stress contributes to negative symptom severity via defeatist beliefs among Black individuals, highlighting potential targets for culturally informed interventions.
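The mediation analysis described above follows the product-of-coefficients logic: the indirect effect is a*b, where a is the path from the predictor to the mediator and b the path from the mediator to the outcome, controlling for the predictor. A minimal OLS sketch (variable names and data are hypothetical, not the study's):

```python
import numpy as np

def ols_coefs(y, X):
    """OLS coefficients for y ~ X, where X already includes an intercept column."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def indirect_effect(x, m, y):
    """Product-of-coefficients mediation: returns (a*b, direct effect c')."""
    ones = np.ones(len(x))
    a = ols_coefs(m, np.column_stack([ones, x]))[1]       # x -> m path
    coefs = ols_coefs(y, np.column_stack([ones, x, m]))
    b, c_prime = coefs[2], coefs[1]                       # m -> y and direct paths
    return a * b, c_prime
```

Bootstrap confidence intervals around a*b (resampling rows and recomputing) are the usual way to test whether the indirect path is nonzero.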
Xu, J.; Parker, R. M. A.; Bowman, K.; Clayton, G. L.; Lawlor, D. A.
Background: Higher levels of sedentary behaviour, such as leisure screen time (LST), and lower levels of physical activity are associated with diseases across multiple body systems, which contribute to a large global health burden. Whether these associations are causal is unclear. The primary aim of this study is to investigate the causal effects of higher LST (given greater power) and, secondarily, lower moderate-to-vigorous intensity physical activity (MVPA), on a wide range of diseases in a hypothesis-free approach. Methods: A two-sample Mendelian randomisation phenome-wide association study was conducted for the main analyses. Single nucleotide polymorphisms (SNPs) were first selected as genetic instruments for LST (hours of television watched per day; 117 SNPs) and MVPA (higher vs. lower; 18 SNPs) at the genome-wide significance threshold (p < 5×10⁻⁸) from the largest relevant genome-wide association study (GWAS). For disease outcomes, we used summary results from FinnGen GWAS, including 1,719 diseases defined by hospital discharge International Classification of Diseases (ICD) codes in 453,733 European participants. For the main analyses, we used the inverse-variance weighting method with a Bonferroni-corrected p-value threshold of p ≤ 3.47×10⁻⁴. Sensitivity analyses included Steiger filtering, MR-Egger and weighted median analyses, and data from UK Biobank were used to explore replication. Findings: Genetically predicted higher LST was associated with increased risk of 87 (5.1% of the 1,719) diseases. Most of these diseases were in the musculoskeletal and connective tissue (n = 37), genitourinary (n = 12) and respiratory (n = 8) systems. Genetic liability to lower MVPA was associated with six diseases: three in musculoskeletal and connective tissue and genitourinary systems (with greater risk of these diseases also identified for higher LST), and three in respiratory and genitourinary systems. Sensitivity analyses largely supported the main analyses. Results replicated in UK Biobank where data were available. Conclusions: Higher levels of sedentary behaviour, and lower levels of physical activity, causally increase the risk of diseases across multiple body systems, making them promising targets for reducing multimorbidity.
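The inverse-variance weighting step used in the main analyses combines per-SNP Wald ratios (outcome effect divided by exposure effect), each weighted by the inverse of its first-order variance. A sketch with made-up summary statistics:

```python
import numpy as np

def ivw_estimate(beta_exp, beta_out, se_out):
    """Inverse-variance weighted MR estimate from per-SNP summary statistics.

    Wald ratio per SNP: beta_out / beta_exp; first-order weight: (beta_exp/se_out)^2."""
    bx, by, se = (np.asarray(a, float) for a in (beta_exp, beta_out, se_out))
    ratio = by / bx
    w = (bx / se) ** 2                       # 1 / var(ratio), first-order approximation
    est = np.sum(w * ratio) / np.sum(w)
    se_ivw = np.sqrt(1 / np.sum(w))
    return float(est), float(se_ivw)
```

Stronger instruments (larger beta_exp relative to the outcome standard error) dominate the weighted average, which is why the 117-SNP LST instrument gives the analysis more power than the 18-SNP MVPA instrument.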
Pietilainen, O.; Salonsalmi, A.; Rahkonen, O.; Lahelma, E.; Lallukka, T.
Objectives: Longer lifespans lead to more years spent in retirement, despite efforts to raise the retirement age. It is therefore important to study how the retirement years can be spent free of disease. This study examined socioeconomic and sociodemographic differences in healthy years spent in retirement. Methods: We followed a cohort of retired Finnish municipal employees (N = 4231, average follow-up 15.4 years) using national administrative registers for major chronic diseases: cancer, coronary heart disease, cerebrovascular disease, diabetes, asthma or chronic obstructive pulmonary disease, dementia, mental disorders, and alcohol-related disorders. Median healthy years in retirement and age at first occurrence of illness (ICD-10- and ATC-based) in each combination of sex, occupational class, and age of retirement were predicted using Royston-Parmar models. Prevalence rates for each diagnostic group were calculated. Results: The most healthy years in retirement were spent by women who had worked in semi-professional jobs and retired at age 60-62 (median predicted healthy years 11.6, 95% CI 10.4-12.7). The fewest healthy years in retirement were spent by men who had worked in routine non-manual jobs and retired after age 62 (median predicted healthy years 6.5, 95% CI 4.4-9.5). Diabetes was slightly more common among lower-occupational-class women, and dementia among women in manual occupations who retired at age 60-62. Discussion: Healthy years in retirement are not enjoyed equally by women and men, or by those who retire earlier or later. Policies aiming to increase the retirement age should consider the effects of these gaps on retirees and the equitability of those effects.
Hung, J.; Smith, A.
The global ambition to end the human immunodeficiency virus (HIV) epidemic requires understanding which system-level policy levers, enacted under the framework of Universal Health Coverage (UHC), are most effective in achieving both transmission reduction and diagnostic coverage. This study addresses an important evidence gap by quantifying the within-country association between measurable UHC policy indicators and the estimated rate of new HIV infections across nine Southeast Asian countries between 2013 and 2022. Employing a fixed-effects panel data methodology, the analysis controls for time-invariant national heterogeneity, ensuring reliable estimates of policy impact. We found that marginal changes in total current health expenditure (CHE) as a percentage of gross domestic product (GDP) were not statistically significantly associated with changes in HIV incidence. However, increases in the UHC Infectious Disease Service Coverage Index were statistically significantly associated with concurrent reductions in HIV incidence (p < 0.001), suggesting that targeted service implementation is the principal driver of curbing new HIV infections. In addition, the UHC Reproductive, Maternal, Newborn, and Child Health Service Coverage Index exhibited a statistically significant positive association with changes in HIV incidence (p < 0.01), interpreted as a surveillance artefact resulting from expanded detection and reporting of previously undiagnosed HIV cases. Furthermore, out-of-pocket (OOP) health expenditure as a percentage of CHE showed a counter-intuitive negative association with changes in HIV incidence (p < 0.01), suggesting this metric primarily reflects ongoing indirect cost burdens on the established patient cohort or, alternatively, a diagnostic access barrier that results in lower case finding.
These findings suggest that policymakers should prioritise investment in targeted infectious disease service efficacy over aggregate fiscal commitment and utilise integrated sexual health platforms for strengthened HIV surveillance and case identification.
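The fixed-effects (within) estimator used here removes time-invariant national heterogeneity by demeaning each variable within each country before regression. A single-regressor sketch (toy data; the study's actual model includes multiple UHC indicators and controls):

```python
import numpy as np

def within_estimator(y, x, group):
    """Fixed-effects (within) slope: demean y and x within each group, then OLS."""
    y, x, g = np.asarray(y, float), np.asarray(x, float), np.asarray(group)
    yd, xd = y.copy(), x.copy()
    for gi in np.unique(g):
        mask = g == gi
        yd[mask] -= y[mask].mean()   # subtract each country's own mean
        xd[mask] -= x[mask].mean()
    return float(np.sum(xd * yd) / np.sum(xd * xd))
```

Because each country is compared only with itself over time, any country-specific constant (baseline epidemic size, wealth, geography) drops out of the estimate.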
Jacobsen, A. M.; Quednow, B. B.; Bavato, F.
ImportanceBlood neurofilament light chain (NfL) and glial fibrillary acidic protein (GFAP) are entering clinical use in neurology as markers of neuroaxonal and astrocytic injury, but their utility in psychiatry is unclear. ObjectiveTo determine whether psychiatric diagnoses are associated with altered plasma NfL and GFAP levels. Design, Setting, and ParticipantsThis population-based study examined plasma NfL and GFAP among 47,495 participants from the UK Biobank (54.0% female; 93.5% White; mean [SD] age 56.8 [8.2] years) who provided blood samples and sociodemographic and clinical data between 2006 and 2010. Normative modeling was applied to assess associations between 7 lifetime psychiatric diagnostic categories and deviations from expected NfL and GFAP levels, while accounting for neurological diagnoses, cardiometabolic burden, and substance use. Data were analyzed between July 2025 and March 2026. Main Outcomes and MeasuresDeviations in plasma NfL and GFAP levels from normative predictions. ResultsRelative to the reference population, plasma NfL levels were higher among individuals with bipolar disorder (d=0.20; 95% CI, 0.03-0.37; p=0.03), recurrent depressive disorder (d=0.23; 95% CI, 0.07-0.38; p=0.009), and depressive episodes (d=0.06; 95% CI, 0.02-0.10; p=0.01), lower among individuals with anxiety disorders (d=-0.07; 95% CI, -0.12 to -0.02; p=0.008), but did not differ in schizophrenia spectrum, stress-related, or other psychiatric disorders. Plasma GFAP levels were not elevated in any psychiatric disorders. Variability in NfL levels was greater among individuals with schizophrenia spectrum disorders (variance ratio [VR]=1.30; p=0.005), depressive episodes (VR=1.06; p=0.006), and anxiety disorders (VR=1.08; p=0.005). Variability in GFAP levels was increased only in anxiety disorders (VR=1.08; p=0.01). 
Plasma NfL levels exceeding percentile-based normative thresholds were more common among individuals with schizophrenia spectrum disorders, bipolar disorder, recurrent depressive disorder, and depressive episodes. Neurological diagnoses, cardiometabolic burden, and substance use were associated with plasma NfL and GFAP levels. Conclusions and Relevance: This study provides population-level evidence of plasma NfL elevation in bipolar and depressive disorders and of increased NfL variability in schizophrenia spectrum, bipolar, and depressive disorders, supporting its potential as a biomarker in psychiatry and informing its ongoing neurological applications. Plasma GFAP levels, in contrast, were largely unaltered across psychiatric disorders.
Key Points:
Question: Are plasma neurofilament light chain (NfL) and glial fibrillary acidic protein (GFAP) levels altered in psychiatric disorders?
Findings: In this cohort study including 47,495 individuals, normative modeling revealed that plasma NfL levels were elevated in bipolar and depressive disorders, whereas plasma GFAP levels were not elevated in any psychiatric disorder. Plasma NfL levels also showed higher variability in schizophrenia spectrum, bipolar, and depressive disorders.
Meaning: Plasma NfL shows distinct alterations in schizophrenia spectrum and affective disorders, supporting its further investigation as a biomarker in clinical psychiatry and highlighting the need to consider psychiatric comorbidity in neurological applications.
Xiao, M.; Girard, Q.; Pender, M.; Rabezara, J. Y.; Rahary, P.; Randrianarisoa, S.; Rasambainarivo, F.; Rasolofoniaina, O.; Soarimalala, V.; Janko, M. M.; Nunn, C. L.
Purpose: Antibiotic use (ABU) is a major driver of antimicrobial resistance (AMR), but ABU patterns are poorly understood in low-income countries, where the burden of AMR is great and ABU is insufficiently regulated. Here, we report ABU from ten sites ranging from rural villages to small cities in Madagascar, a country with high AMR levels, and present results from modeling to identify factors that may be associated with ABU in this setting. Methods: We conducted surveys of 290 individuals from ten sites in the SAVA Region of northeast Madagascar to gather data on sociodemographic characteristics, agricultural and animal husbandry practices, recent antibiotic use, the antibiotics that participants recalled using in their lifetimes, and the sources of their antibiotics. Using these data, we conducted statistical analyses with a mixed-effects logistic model to determine which characteristics were associated with recent antibiotic use. Results: Nearly all respondents (N=283, 97.6%) reported ABU in their lifetimes, with amoxicillin being the most widely reported antibiotic (N=255, 90.1% of those reporting ABU). All recalled antibiotics were classified as frontline drugs except for ciprofloxacin. Most respondents who reported antibiotic use also reported obtaining antibiotics without prescriptions from local stores (N=273, 96.5%), while only 52.3% (N=148) reported obtaining antibiotics through a prescriptive route, such as from a health clinic or private doctor. Of the 127 individuals (44.9%) who reported recent ABU, men were significantly less likely than women to have recently taken antibiotics. Conclusions: Our findings provide new insights into ABU in agricultural settings in low-income countries, which have historically been understudied in AMR and pharmacoepidemiologic research. Knowledge of ABU patterns supports understanding of AMR dynamics and AMR control efforts in these contexts, such as interventions on inappropriate antibiotic dispensing.
Key points:
- Antibiotic use (ABU) in Madagascar is largely unstudied despite its role in antimicrobial resistance (AMR), of which Madagascar bears a high burden.
- ABU was widespread among livestock owners in northeast Madagascar, with the majority of study participants reporting ABU in their lifetimes and many of those reporting ABU also having taken antibiotics in the previous three months.
- Most respondents reported obtaining their antibiotics from non-pharmaceutical stores, indicating high levels of unregulated ABU, though more than half also reported sourcing antibiotics through prescriptive means (such as doctors and health clinics).
- Men were less likely than women to have taken antibiotics in the previous three months.
- These findings support the development of interventions to mitigate the burden of AMR in Madagascar and similar contexts while underscoring the need for more comprehensive research on the drivers and patterns of ABU.
Plain language summary: In this study, we provide basic information on antibiotic use (ABU) patterns in Madagascar, a country that experiences high levels of resistance but has been particularly understudied in AMR and pharmacological research. We surveyed 290 farmers with livestock from ten sites across northeast Madagascar about their ABU and found that nearly all study participants (N=283, 97.6%) had used antibiotics in their lifetimes, while a little under half of those who reported ABU also reported using antibiotics in the previous three months (N=127, 44.9%). The most commonly used antibiotic was amoxicillin (N=255, 90.1%). Most people obtained their antibiotics from sources that do not require prescriptions, such as general stores, indicating that most ABU is unregulated. Through modeling, we also found that men were less likely than women to have taken antibiotics in the previous three months (OR=0.50, CI 0.30-0.82).
These findings help us better understand the dynamics of ABU in low-income countries, which have historically been understudied in AMR and pharmacological research. They also support efforts to mitigate the burden of AMR by revealing ABU dynamics that may contribute to the emergence and spread of AMR, as well as identifying targets for intervention to curb inappropriate ABU.
Shaetonhodi, N. G.; De Vos, L.; Babalola, C.; de Voux, A.; Joseph Davey, D.; Mdingi, M.; Peters, R. P. H.; Klausner, J. D.; Medina-Marino, A.
Background: Curable sexually transmitted infections (STIs), including Chlamydia trachomatis, Neisseria gonorrhoeae, and Trichomonas vaginalis, remain highly prevalent among pregnant women in South Africa. Despite poor diagnostic performance in pregnancy, syndromic management remains standard care. Point-of-care (POC) screening enables aetiological diagnosis and same-visit treatment but is not yet included in national guidelines. We conducted a mixed-methods process evaluation to examine determinants of antenatal POC STI screening implementation in public facilities. Methods: This evaluation was embedded within the three-arm Philani Ndiphile randomized trial (March 2021-February 2025) across four public clinics in the Eastern Cape. Screening used a near-POC, electricity-dependent nucleic acid amplification test with a 90-minute turnaround time. Reach, Adoption, Implementation, and Maintenance were assessed using the RE-AIM framework. Quantitative indicators included uptake of screening, treatment, and follow-up attendance. Qualitative data included in-depth interviews with 20 pregnant women and five focus group discussions with 21 research staff and government healthcare workers. The Consolidated Framework for Implementation Research guided qualitative analysis. Findings were integrated using narrative weaving. Results: Screening uptake was high (99.0%), with treatment coverage of 95.2% at baseline and 93.5% at repeat screening. Same-day treatment was lower (50.7% and 69.8%, respectively) and varied substantially by facility, reflecting operational constraints including turnaround time, patient volume, infrastructure, and electricity. Attendance was higher when screening was integrated into routine ANC. Women valued screening for infant health, while providers recognised advantages over syndromic management but highlighted workforce, resource, and maintenance constraints. Socioeconomic factors, including transport costs, hunger, and work commitments, influenced retention and waiting times.
Conclusions: Antenatal POC STI screening was acceptable and achieved high treatment coverage in a research setting. However, same-day treatment was constrained by operational requirements of the testing platform. Scale-up will require workflow integration, strengthened health system capacity, and faster diagnostics suited to routine antenatal care.
Key Messages:
What is already known on this topic: Syndromic management remains standard antenatal care in many low-resource settings despite failing to capture up to 89% of infections that remain asymptomatic. Point-of-care aetiological screening has demonstrated feasibility, acceptability, and potential clinical benefit in research settings, yet has not been widely adopted into national policy. Limited evidence exists on the health system requirements and contextual determinants influencing scale-up within routine public facilities.
What this study adds: This mixed-methods process evaluation demonstrates high uptake and treatment coverage of antenatal POC STI screening in a trial setting, while identifying facility-level, structural, and socioeconomic factors shaping same-day treatment and retention. We show that implementation success varies substantially across clinics and depends on assay characteristics, workflow integration, human resources, infrastructure reliability, and follow-up capacity.
How this study might affect research, practice or policy: These findings provide implementation-relevant evidence to inform national policy deliberations on integrating POC STI screening into antenatal care. Sustainable scale-up will require context-adapted delivery models, strengthened workforce and supply systems, faster diagnostics, and alignment with existing ANC workflows to ensure equitable and durable impact.
Areb, M.; Huybregts, L.; Tamiru, D.; Toure, M.; Biru, B.; Fall, T.; Haddis, A.; Belachew, T.
Background: This study aimed to assess caregiver knowledge of Infant and Young Child Feeding (IYCF), child health, severe acute malnutrition (SAM) screening, and Community-Based Management of Acute Malnutrition (CMAM); its determinants; and its associations with IYCF and WaSH (water, sanitation, and hygiene) practices among caregivers of children aged 6-59 months with SAM in Ethiopian agrarian and pastoralist settings. Methods: Data were from the baseline survey of the R-SWITCH Ethiopia cluster-randomized controlled trial (cRCT), which screened ~28,000 children aged 6-59 months and identified 686 SAM cases. Caregiver knowledge was evaluated using a validated 32-item questionnaire (Cronbach's α for internal reliability) and analyzed via linear mixed-effects and Poisson regression models in Stata 17. Results: Caregiver knowledge was positively associated with improved IYCF/WaSH practices among children aged 6-23 months with SAM, including higher minimum dietary diversity (MDD: IRR=1.50), minimum acceptable diet (MAD: IRR=1.63), and reduced zero vegetable/fruit intake (IRR=0.77), as well as with MDD in children aged 24-59 months, improved water access (IRR=1.19), water treatment (IRR=2.02), and handwashing stations (IRR=1.41). Being literate (β=4.1; 95% CI: 1.5-6.6; p=0.016), being pregnant (β=4.4; 95% CI: 0.9-7.8; p=0.018), having the child weighed at a health post or health center (β=8.9; 95% CI: 3.5-14.2; p≤0.001), and higher household wealth index (β=11.8; 95% CI: 3.6-20.1; p=0.005) were associated with higher knowledge, whereas possible depression (β=-0.3; 95% CI: -0.5 to 0.0; p=0.015) was associated with lower knowledge. Conclusion: Caregiver knowledge was associated with better IYCF/WaSH practices among children aged 6-59 months with SAM. Literacy, pregnancy, having the child weighed at a health post or health center, and greater household wealth were associated with higher caregiver knowledge, whereas possible depression was associated with lower knowledge.
Integrating context-specific caregiver education and mental health support into CMAM, growth monitoring and promotion (GMP), and primary care services could enhance feeding and WaSH practices in Ethiopia.